Meat quality of farmed red deer fed a balanced diet: effects of supplementation with copper bolus on different muscles
- M. P. Serrano, A. Maggiolino, J. M. Lorenzo, P. De Palo, A. García, T. Landete-Castillejos, P. Gambín, J. Cappelli, R. Domínguez, F. J. Pérez-Barbería, L. Gallego
Supplementation with copper (Cu) improves deer antler characteristics, but it could modify meat quality and raise meat Cu content to levels potentially harmful for humans. Here, we studied the effects of Cu supplementation, by means of an intraruminal bolus, on the quality and composition of the sternocephalicus (ST) and rectus abdominis (RA) muscles (n = 13 each) from yearling male red deer fed a balanced diet. Each bolus, containing 3.4 g of Cu, was administered orally to the treatment group for comparison with an unsupplemented control group. Meat traits studied were pH at 24 h postmortem (pH24), colour, chemical composition, cholesterol content, fatty acid (FA) composition, amino acid (AA) profile and mineral content. In addition, the effect of Cu supplementation on the mineral composition of liver and serum (at 0 and 90 days of treatment) was analysed. No interactions between Cu supplementation and muscle were observed for any trait. Supplementation with Cu increased the protein content of meat (P<0.01). However, the Cu content of meat, liver and serum was not modified by supplementation. In fact, the Cu content of meat (1.20 and 1.34 mg/kg for Cu-supplemented and control deer, respectively) was in both groups well below the 5 mg/kg fresh weight legally allowed for food of animal origin. However, the Cu bolus tended to increase the zinc content of meat and significantly increased (P<0.05) the hepatic contents of sodium and lead. The muscles studied differed in composition and characteristics. The RA muscle had higher protein content (P<0.001), monounsaturated FA content (P<0.05) and essential/non-essential AA ratio (P<0.01), but lower pH24 (P<0.01) and polyunsaturated FA content (P=0.001), than the ST muscle. In addition, the RA muscle had 14.4% less cholesterol (P=0.001) than the ST muscle.
The mineral profile also differed between muscles: compared with RA, the ST muscle had a higher content of iron, a significantly higher (P<0.001) content of zinc, and lower contents of calcium, magnesium and phosphorus (P<0.05). Therefore, supplementation with Cu modified deer meat characteristics, but it did not raise meat Cu concentration to toxic levels, making it a safe practice from this perspective. Despite its lower polyunsaturated FA content, quality was better for the RA than for the ST muscle on the basis of its higher protein content, higher essential/non-essential AA ratio, and lower pH24 and cholesterol content.
The development of an intraruminal nylon bag technique using non-fistulated animals to assess the rumen degradability of dietary plant materials
- J. H. Pagella, R. W. Mayes, F. J. Pérez-Barbería, E. R. Ørskov
Although the conventional in situ ruminal degradability method is a relevant tool for describing the nutritional value of ruminant feeds, its reliance on rumen-fistulated animals restricts its use on animal welfare and cost grounds. The aim of the present work was to develop a ruminal degradability technique that avoids using surgically prepared animals. The concept was to orally dose a series of porous bags containing the test feeds at different times before slaughter, at which point the bags would be recovered from the rumen for degradation measurement. Bags, smaller than those used in the conventional nylon bag technique, were made from woven nylon fabric in two shape designs (rectangular flat, tetrahedral) and were fitted with one of three types of device to prevent regurgitation. These bags were used in two experiments with individually housed, non-pregnant, non-lactating sheep as host animals for the in situ ruminal incubation of forage substrates. The bags were closed at the top edge by machine stitching and wrapped in tissue paper before oral dosing. Standard ruminal incubation times in all tests were 4, 8, 16, 24, 48, 72 and 96 h before slaughter. The first experiment compared the effectiveness of the three anti-regurgitation device designs, constructed from nylon cable ties (‘Z-shaped’, ARD1; ‘double Z-shaped’, ARD2; ‘umbrella-shaped’, ARD3), and tested whether viable degradation curves could be generated using grass hay as the substrate. In the second experiment, three other substrates (perennial ryegrass, red clover and barley straw) were compared using flat and tetrahedral bags fitted with ARD1 anti-regurgitation devices.
Non-linear mixed-effect regression models were used to fit asymptotic exponential curves of percentage dry matter loss against time of incubation in the reticulorumen for the four substrates, and to test the effects of anti-regurgitation device type and nylon bag shape. All three devices were highly successful at preventing regurgitation, with 93% to 100% of dosed bags recovered in the reticulorumen at slaughter. Ruminal degradation data obtained for the tested forages were in accordance with those expected from the conventional degradability technique using fistulated animals, with no significant differences in the asymptotic values of the degradation curves between bag shapes or anti-regurgitation devices. These results demonstrate the potential of a small-bag technique with intact sheep for characterising the in situ ruminal degradability of roughages.
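The asymptotic exponential form described above is commonly parameterised as p(t) = a + b(1 − e^(−ct)), where a is the rapidly soluble fraction, b the slowly degradable fraction and c the fractional degradation rate. A minimal sketch of fitting such a curve to dry matter loss data, using the study's incubation times but with simulated (not the study's) loss values, might look like this:

```python
# Sketch of fitting the asymptotic exponential degradation model
# p(t) = a + b * (1 - exp(-c * t)) to in situ dry matter loss data.
# The DM-loss values are generated from known parameters for illustration;
# they are NOT data from the study.
import numpy as np
from scipy.optimize import curve_fit

def degradation(t, a, b, c):
    """a = soluble fraction (%), b = degradable fraction (%),
    c = fractional degradation rate (per h)."""
    return a + b * (1.0 - np.exp(-c * t))

# Incubation times used in the study (h before slaughter)
hours = np.array([4, 8, 16, 24, 48, 72, 96], dtype=float)

# Simulate noiseless data from assumed "true" parameters
true_a, true_b, true_c = 15.0, 52.0, 0.06
dm_loss = degradation(hours, true_a, true_b, true_c)

# Fit the model; p0 gives rough starting values for the optimiser
params, _ = curve_fit(degradation, hours, dm_loss, p0=[10.0, 40.0, 0.03])
a, b, c = params
print(f"a = {a:.1f}%, b = {b:.1f}%, c = {c:.3f}/h, "
      f"potential degradability a+b = {a + b:.1f}%")
```

The asymptote a + b corresponds to the potential degradability compared across bag shapes and devices in the study; the full analysis additionally used mixed effects to account for between-animal variation, which this sketch omits.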
Voluntary intake and digestibility in horses: effect of forage quality with emphasis on individual variability
- N. Edouard, G. Fleurance, W. Martin-Rosset, P. Duncan, J. P. Dulphy, S. Grange, R. Baumont, H. Dubroeucq, F. J. Pérez-Barbería, I. J. Gordon
Food intake is a key biological process in animals, as it determines the energy and nutrients available for physiological and behavioural processes. In herbivores, the abundance, structure and quality of plant resources are known to influence intake strongly. In ruminants, digestibility and total intake decline as forage quality declines. Equids are believed to be adapted to consuming high-fibre, low-quality forages; as hindgut fermenters, it has been suggested that they respond to a reduction in food quality by increasing intake so as to maintain rates of energy and nutrient absorption. All reviews of horse nutrition show that digestibility declines with forage quality; for intake, however, most studies have found no significant relationship with forage quality, and it has even been suggested that horses, like ruminants, may eat less as forage quality declines. A weakness of these reviews is that they combine data from different studies in meta-analyses without controlling for differences between animals and diets. In this study, we analysed a set of 45 trials in which intake and digestibility were measured in 21 saddle horses. The dataset was analysed both at the group level (to allow comparisons with the literature) and at the individual level (to control for individual variability). As expected, dry matter digestibility declined with forage quality in both analyses. Intake declined slightly with increasing fibre content at the group level, and there were no effects of crude protein or dry matter digestibility on intake. Overall, the analysis for individual horses showed a different pattern: intake increased as digestibility and crude protein declined, and increased with increasing fibre. Our analysis at the group level confirms previous reviews and shows that forage quality explains little of the variance in food intake in horses.
For the first time, using mixed models, we show that the variable ‘individual’ clarifies the picture: horses differed in their responses to a decrease in forage quality. Some compensated for the low nutritional value of the forages by increasing intake; a few others decreased intake with declining forage quality, though not enough to cause any deficit in their energy and protein supplies. On the whole, all the animals managed to meet their maintenance requirements. This individual variability may be a by-product of artificial selection for competition performance in saddle horses.
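The group-versus-individual contrast described above can be illustrated with a toy simulation (invented data and variable names, not the study's): when each horse has its own intake response to fibre content, a single pooled regression slope can look flat even though individual slopes diverge. The study itself used mixed models; this minimal sketch uses plain per-horse least squares to show the same idea:

```python
# Toy illustration (simulated data, NOT the study's): each horse gets its
# own intake~fibre slope. Pooling all trials hides this heterogeneity,
# while per-horse regressions recover it.
import numpy as np

rng = np.random.default_rng(42)
n_horses, n_trials = 8, 6
true_slopes = rng.normal(0.0, 0.08, n_horses)  # horse-specific responses

rows = []
for h in range(n_horses):
    fibre = rng.uniform(45, 65, n_trials)       # % NDF, illustrative range
    intake = 85 + true_slopes[h] * fibre + rng.normal(0, 0.3, n_trials)
    rows.append((fibre, intake))

# Group-level (pooled) slope: individual differences average out
all_f = np.concatenate([f for f, _ in rows])
all_i = np.concatenate([i for _, i in rows])
pooled_slope = np.polyfit(all_f, all_i, 1)[0]

# Individual-level slopes: some may be positive, others negative
horse_slopes = [np.polyfit(f, i, 1)[0] for f, i in rows]
print(f"pooled slope: {pooled_slope:+.3f}")
print("per-horse slopes:", np.round(horse_slopes, 3))
```

A mixed model with a random slope for ‘individual’ formalises this: it estimates the population-average response while letting each horse deviate from it, which is why the individual-level analysis in the study could reveal responses the pooled analysis obscured.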
The influence of molar occlusal surface area on the voluntary intake, digestion, chewing behaviour and diet selection of red deer (Cervus elaphus)
- F. J. Pérez-Barbería, I. J. Gordon
-
- Journal: Journal of Zoology / Volume 245 / Issue 3 / July 1998
- Published online by Cambridge University Press: 05 April 2001, pp. 307-316
- Print publication: July 1998
The loss of tooth effectiveness due to molar wear has been proposed as an important cause of mortality in ungulate populations. Voluntary intake, digestibility, mean retention times, chewing behaviour (during eating and rumination) and diet selection (physical selection: short vs long hay particles; botanical selection: leaf vs stem) were compared in two groups of female red deer (Cervus elaphus) that differed in molar occlusal surface area, to test the hypothesis that behavioural and physiological mechanisms can be used to maintain assimilation efficiency in the presence of low functional tooth effectiveness. The group with lower values of first lower molar occlusal surface area (OSA), corrected for body weight, had lower voluntary food intakes (P = 0.0126). The low OSA group had a greater number of chews per g DM of food ingested (P = 0.0167) and a greater time spent chewing (P = 0.0476), but a lower number of chews per min than the high OSA group (P = 0.0484). The total number of chews per day was similar for both groups (P = 0.2011). The number of ruminating chews per day was lower for the low OSA group (P = 0.0377). The group with low OSA values had a larger average particle size in their faeces (P = 0.0346). No differences were detected between groups in the physical or botanical composition of the diet selected (P = 0.3030 and P = 0.3056, respectively) or in total digestibility and mean retention times (P = 0.1357 and P = 0.3464, respectively). As a consequence of the lower voluntary intake, the low OSA group had a lower digestible dry matter intake (P = 0.0170). This study supports the view that intake modification and the time invested in chewing during eating are the main mechanisms used to compensate for reduced chewing effectiveness associated with changes in tooth morphology, although the compensation is not total.